Self-Driving Car Engineer Nanodegree

Deep Learning

Project: Build a Traffic Sign Recognition Classifier

In this notebook, a template is provided for you to implement your functionality in stages, as required to successfully complete this project. If additional code is needed that cannot be included in the notebook, be sure the Python code is successfully imported and included in your submission. Sections that begin with 'Implementation' in the header indicate where you should begin your implementation. Note that some sections of the implementation are optional and are marked with 'Optional' in the header.

In addition to implementing code, there will be questions that you must answer which relate to the project and your implementation. Each section where you will answer a question is preceded by a 'Question' header. Carefully read each question and provide thorough answers in the following text boxes that begin with 'Answer:'. Your project submission will be evaluated based on your answers to each of the questions and the implementation you provide.

Note: Code and Markdown cells can be executed using the Shift + Enter keyboard shortcut. In addition, Markdown cells can be edited by double-clicking the cell to enter edit mode.


Step 1: Dataset Exploration

Visualize the German Traffic Signs Dataset. This is open ended; some suggestions include plotting traffic sign images, plotting the count of each sign, etc. Be creative!

The pickled data is a dictionary with 4 key/value pairs:

  • features -> the image pixel values, (width, height, channels)
  • labels -> the label of the traffic sign
  • sizes -> the original width and height of the image, (width, height)
  • coords -> coordinates of a bounding box around the sign in the image, (x1, y1, x2, y2). Based on the original image (not the resized version).
In [1]:
# Load pickled data
# TODO: fill this in based on where you saved the training and testing data
import pickle

training_file = '../../../data/traffic-signs-data/train.p'
testing_file = '../../../data/traffic-signs-data/test.p'

with open(training_file, mode='rb') as f:
    train = pickle.load(f)
with open(testing_file, mode='rb') as f:
    test = pickle.load(f)
    
X_train, y_train = train['features'], train['labels']
X_test, y_test = test['features'], test['labels']
In [2]:
### To start off let's do a basic data summary.

# TODO: number of training examples
n_train = X_train.shape[0]

# TODO: number of testing examples
n_test = X_test.shape[0]

# TODO: what's the shape of an image?
image_shape = X_train[0].shape

# TODO: how many classes are in the dataset
n_classes = len(set(y_train))

print("Number of training examples =", n_train)
print("Number of testing examples =", n_test)
print("Image data shape =", image_shape)
print("Number of classes =", n_classes)
Number of training examples = 39209
Number of testing examples = 12630
Image data shape = (32, 32, 3)
Number of classes = 43
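Since the number of examples per label matters later (some signs are far more common than others), class frequencies can be tallied directly from the label array. A minimal NumPy sketch, where `labels` is a toy stand-in for `y_train`:

```python
import numpy as np

# Toy stand-in for y_train: class ids in [0, n_classes)
labels = np.array([0, 2, 2, 1, 2, 0])

# bincount returns one count per class id, including zero-count classes,
# and does not require the labels to be sorted first
counts = np.bincount(labels, minlength=3)
print(counts)  # [2 1 3]
```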
In [594]:
def show_category(data, category, counts):
    # Show up to `counts` images belonging to the given category.
    # Relies on the global y_train labels, which are grouped by class.
    i = 1
    col = 5
    row = int(np.ceil(counts / col))
    plt.figure(figsize=(15, 30))
    for k, group in groupby(enumerate(y_train), lambda pair: pair[1]):
        if k == category:
            for index, cat in group:
                plt.subplot(row, col, i)
                plt.imshow(data[index, :, :, :])
                plt.title("{0}".format(index))
                plt.axis('off')
                i += 1
                if i > counts:
                    break

    plt.show()
In [646]:
### Data exploration visualization goes here.
### Feel free to use as many code cells as needed.
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from itertools import groupby

sign_name_fn = './signnames.csv'
# Map class id -> sign name, read from the provided CSV
sign_names = dict(np.array(pd.read_csv(sign_name_fn, delimiter=',', header=0)))

def show_sample_by_group(data, label, signames):
    # Record (class id, index of the first image) for each class
    first_img = []
    for k, group in groupby(enumerate(label), lambda pair: pair[1]):
        first_img.append((k, list(group)[0][0]))
        
    col = 4
    row = int(np.ceil(len(first_img) / col))
    plt.figure(figsize = (15, 30))
    
    for i, (k, j) in enumerate(first_img):
        plt.subplot(row, col, i+1)
        plt.axis('off')
        plt.title("{0:2d} {1:.32}".format(k, signames[k]))
        plt.imshow(data[j])

    plt.show()

show_sample_by_group(X_train, y_train, sign_names)

Step 2: Design and Test a Model Architecture

Design and implement a deep learning model that learns to recognize traffic signs. Train and test your model on the German Traffic Sign Dataset.

There are various aspects to consider when thinking about this problem:

  • Your model can be derived from a deep feedforward net or a deep convolutional network.
  • Play around with preprocessing techniques (normalization, RGB to grayscale, etc.)
  • Number of examples per label (some have more than others).
  • Generate fake data.

Here is an example of a published baseline model on this problem. You aren't required to be familiar with the approach used in the paper, but it's good practice to try to read papers like these.

Implementation

Use the code cell (or multiple code cells, if necessary) to implement the first step of your project. Once you have completed your implementation and are satisfied with the results, be sure to thoroughly answer the questions that follow.

In [547]:
### Preprocess the data here.
### Feel free to use as many code cells as needed.
import numpy as np

def show_sample_distribution(label):
    plt.hist(label, bins=n_classes)
    plt.show()

show_sample_distribution(y_train)

# Count examples per class (useful for deciding where to generate fake data);
# np.bincount works without sorting the labels first
k_c_map = dict(enumerate(np.bincount(y_train)))

# Zero-center the data using the mean training image
n, w, h, c = X_train.shape
mean_image = np.mean(X_train, axis=0)
X_train_norm = (X_train - mean_image) / 255
X_test_norm = (X_test - mean_image) / 255

def preprocess(data):
    # Return a new float array rather than modifying uint8 input in place
    return (data - mean_image) / 255

plt.hist(X_train_norm.flatten())
plt.show()

Question 1

Describe the techniques used to preprocess the data.

Answer:

  1. Zero mean: subtract the mean of the training images
  2. Normalize: divide by the maximum pixel value (255) of the training images
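The two steps above can be sanity-checked on a toy batch. A minimal sketch, assuming uint8 pixel data like the traffic-sign images:

```python
import numpy as np

# Toy batch of two 2x2 single-channel "images" with uint8 pixels
X = np.array([[[0, 255], [128, 64]],
              [[255, 0], [128, 192]]], dtype=np.uint8)

mean_image = X.mean(axis=0)          # per-pixel mean over the batch
X_norm = (X - mean_image) / 255      # zero-center, then scale by the max pixel value

# After zero-centering, the per-pixel mean is (numerically) zero
assert np.allclose(X_norm.mean(axis=0), 0)
# and all values lie in [-1, 1]
assert X_norm.min() >= -1 and X_norm.max() <= 1
```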
In [5]:
### Generate additional data (if you want to!)
### and split the data into training/validation/testing sets here.
y_train_ohe = np.zeros([n_train, n_classes])
y_train_ohe[np.arange(n_train), y_train] = 1

y_test_ohe = np.zeros([n_test, n_classes])
y_test_ohe[np.arange(n_test), y_test] = 1

indices = list(range(n_train))
np.random.shuffle(indices)
n_train_t = int(0.8 * n_train)
X_train_norm_t = X_train_norm[indices[:n_train_t]]
X_train_norm_v = X_train_norm[indices[n_train_t:]]

y_train_ohe_t = y_train_ohe[indices[:n_train_t]]
y_train_ohe_v = y_train_ohe[indices[n_train_t:]] 

y_train_t = y_train[indices[:n_train_t]]
y_train_v = y_train[indices[n_train_t:]]
### Feel free to use as many code cells as needed.
print("training data:", X_train_norm_t.shape)
print("validation data:", X_train_norm_v.shape)
print("test data:", X_test.shape)

num_train = X_train_norm_t.shape[0]
num_val = X_train_norm_v.shape[0]
num_test = X_test.shape[0]
training data: (31367, 32, 32, 3)
validation data: (7842, 32, 32, 3)
test data: (12630, 32, 32, 3)

Question 2

Describe how you set up the training, validation and testing data for your model. If you generated additional data, why?

Answer:

  1. Split training/validation data randomly 80:20 from the original training dataset
  2. Keep the original test dataset as test data
  3. The original data is imbalanced, so generating additional data for the rare classes would be worth trying.
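One cheap way to generate such additional data is label-preserving jitter. A hedged sketch (not the code used in this notebook): small translations plus brightness changes, using NumPy only; rotations are deliberately avoided because some sign classes are orientation-sensitive (e.g., turn-left vs. turn-right).

```python
import numpy as np

def augment(img, rng):
    # Cheap label-preserving jitter for under-represented classes
    dx, dy = rng.integers(-2, 3, size=2)                 # shift up to 2 px
    out = np.roll(img, (dy, dx), axis=(0, 1))            # translate (wraps at edges)
    out = np.clip(out * rng.uniform(0.8, 1.2), 0, 255)   # brightness jitter
    return out.astype(img.dtype)

rng = np.random.default_rng(0)
img = np.full((32, 32, 3), 128, dtype=np.uint8)          # dummy 32x32 RGB image
aug = augment(img, rng)
assert aug.shape == img.shape and aug.dtype == img.dtype
```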
In [6]:
### Define your architecture here.
### Feel free to use as many code cells as needed.
import tensorflow as tf
import numpy as np
import tensorflow.contrib.slim as slim

def DenseBlock(data, layer_i, bottleneck_layers):
    # A block of densely connected 3x3 convolutions: each layer sees the
    # concatenation (along the channel axis) of all previous layers' outputs.
    with tf.variable_scope("dense_w" + str(layer_i)):
        nodes = []
        a = slim.conv2d(data, 64, [3, 3], normalizer_fn=slim.batch_norm)
        nodes.append(a)
        for z in range(bottleneck_layers):
            b = slim.conv2d(tf.concat(3, nodes), 64, [3, 3], normalizer_fn=slim.batch_norm)
            nodes.append(b)
        return b

tf.reset_default_graph()

dense_layers = 25
dense_blocks = 5
bottleneck_layers = int(dense_layers / dense_blocks)

input_layer = tf.placeholder(shape=[None,32,32,3],dtype=tf.float32,name='input')
label_layer = tf.placeholder(shape=[None],dtype=tf.int32)
label_oh = slim.layers.one_hot_encoding(label_layer, n_classes)

layer1 = slim.conv2d(input_layer, 64,[3,3],normalizer_fn=slim.batch_norm,scope='w_'+str(0))
for i in range(dense_blocks):
    layer1 = DenseBlock(layer1, i, bottleneck_layers)
    layer1 = slim.conv2d(layer1, 64, [3,3], stride=[2,2], normalizer_fn=slim.batch_norm, scope='w_s_'+str(i))
    
top = slim.conv2d(layer1, n_classes, [3,3], normalizer_fn=slim.batch_norm,activation_fn=None,scope='w_top')
output = slim.layers.softmax(slim.layers.flatten(top))
# Epsilon goes inside the log so the loss stays finite when a softmax output is 0
loss = tf.reduce_mean(-tf.reduce_sum(label_oh * tf.log(output + 1e-10), reduction_indices=[1]))

Question 3

What does your final architecture look like? (Type of model, layers, sizes, connectivity, etc.) For reference on how to build a deep neural network using TensorFlow, see Deep Neural Network in TensorFlow from the classroom.

Answer: I chose the "DenseNet" architecture (https://arxiv.org/pdf/1608.06993v3.pdf) because it performed better than the alternatives I tried (e.g., a VGG-like net) on the validation dataset. The DenseNet has 5 dense blocks, with 5 layers within each dense block. All convolution layers use batch normalization.
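The cross-entropy loss used to train this network can be checked outside TensorFlow. A minimal NumPy sketch of the same quantity, with the epsilon inside the log so that zero probabilities stay finite (`probs` and `onehot` are illustrative values, not data from this notebook):

```python
import numpy as np

def cross_entropy(probs, onehot, eps=1e-10):
    # Mean over the batch of -sum(y * log(p)); eps inside the log guards p == 0
    return float(np.mean(-np.sum(onehot * np.log(probs + eps), axis=1)))

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
onehot = np.array([[1, 0, 0],
                   [0, 1, 0]])

loss = cross_entropy(probs, onehot)
# Only the probability of the true class contributes:
assert np.isclose(loss, -(np.log(0.7) + np.log(0.8)) / 2)
```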

In [9]:
BATCH_SIZE = 64
EPOCHS = 200

def sample_indices(data, size):
    indices = list(range(data.shape[0]))
    np.random.shuffle(indices)
    return indices[:size]

trainer = tf.train.AdamOptimizer(learning_rate=0.001)
update = trainer.minimize(loss)

pred_correct = tf.equal(tf.argmax(output, 1), tf.argmax(label_oh, 1))
acc_op = tf.reduce_mean(tf.cast(pred_correct, tf.float32))


with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    steps = num_train // BATCH_SIZE
    num_samples = steps * BATCH_SIZE    
    total_acc, total_loss = 0, 0
    saver = tf.train.Saver()
    for i in range(EPOCHS):
        for step in range(steps):
            # Note: batches are sampled from the full normalized training set
            indices = sample_indices(X_train, BATCH_SIZE)
            batch_x = X_train_norm[indices, :, :, :]
            batch_y = y_train[indices]
            loss_, acc_, _ = sess.run([loss, acc_op, update], feed_dict={input_layer: batch_x, label_layer: batch_y})
            if step % 100 == 0:
                print("epoch:{0}-step:{1}:loss:{2}, acc:{3}".format(i, step, loss_, acc_))    
        
        if i % 10 == 0:
            #print("epoch:{0}:loss_v:{1},acc_v:{2}".format(i, loss_v, acc_v))
            saver.save(sess, "model3.ckpt")
epoch:0-step:0:loss:4.154956817626953, acc:0.046875
epoch:0-step:100:loss:2.6544742584228516, acc:0.28125
epoch:0-step:200:loss:1.9725388288497925, acc:0.484375
epoch:0-step:300:loss:1.6187902688980103, acc:0.59375
epoch:0-step:400:loss:1.1927974224090576, acc:0.765625
epoch:1-step:0:loss:1.0967597961425781, acc:0.828125
epoch:1-step:100:loss:0.8642300367355347, acc:0.84375
epoch:1-step:200:loss:0.5915610194206238, acc:0.921875
epoch:1-step:300:loss:0.582996129989624, acc:0.96875
epoch:1-step:400:loss:0.5438264608383179, acc:0.9375
epoch:2-step:0:loss:0.5313405394554138, acc:0.921875
epoch:2-step:100:loss:0.6850073933601379, acc:0.9375
epoch:2-step:200:loss:0.36295777559280396, acc:0.96875
epoch:2-step:300:loss:0.3993617594242096, acc:0.984375
epoch:2-step:400:loss:0.3694796562194824, acc:1.0
epoch:3-step:0:loss:0.49878865480422974, acc:0.953125
epoch:3-step:100:loss:0.5064420700073242, acc:0.984375
epoch:3-step:200:loss:0.5009831786155701, acc:0.9375
epoch:3-step:300:loss:0.45756906270980835, acc:0.953125
epoch:3-step:400:loss:0.5205979943275452, acc:0.984375
epoch:4-step:0:loss:0.4264024794101715, acc:0.96875
epoch:4-step:100:loss:0.34478604793548584, acc:0.96875
epoch:4-step:200:loss:0.4224657416343689, acc:0.96875
epoch:4-step:300:loss:0.3829379677772522, acc:0.984375
epoch:4-step:400:loss:0.4849051237106323, acc:0.9375
epoch:5-step:0:loss:0.4020558297634125, acc:1.0
epoch:5-step:100:loss:0.4591820240020752, acc:0.96875
epoch:5-step:200:loss:0.31451544165611267, acc:1.0
epoch:5-step:300:loss:0.3678431510925293, acc:0.984375
epoch:5-step:400:loss:0.3378439247608185, acc:0.984375
epoch:6-step:0:loss:0.5304629802703857, acc:0.9375
epoch:6-step:100:loss:0.3315548896789551, acc:1.0
epoch:6-step:200:loss:0.36920061707496643, acc:0.96875
epoch:6-step:300:loss:0.37146785855293274, acc:1.0
epoch:6-step:400:loss:0.3531312346458435, acc:1.0
epoch:7-step:0:loss:0.3659355640411377, acc:0.96875
epoch:7-step:100:loss:0.2970506250858307, acc:1.0
epoch:7-step:200:loss:0.5152977108955383, acc:0.953125
epoch:7-step:300:loss:0.4044414162635803, acc:0.984375
epoch:7-step:400:loss:0.2705976963043213, acc:1.0
epoch:8-step:0:loss:0.31411826610565186, acc:0.984375
epoch:8-step:100:loss:0.3422676622867584, acc:0.96875
epoch:8-step:200:loss:0.3950996994972229, acc:0.96875
epoch:8-step:300:loss:0.3357853293418884, acc:0.984375
epoch:8-step:400:loss:0.40115973353385925, acc:0.984375
epoch:9-step:0:loss:0.5411560535430908, acc:0.953125
epoch:9-step:100:loss:0.27235427498817444, acc:1.0
epoch:9-step:200:loss:0.27410173416137695, acc:1.0
epoch:9-step:300:loss:0.2925102710723877, acc:0.96875
epoch:9-step:400:loss:0.2640795409679413, acc:1.0
epoch:10-step:0:loss:0.30084824562072754, acc:1.0
epoch:10-step:100:loss:0.27901768684387207, acc:1.0
epoch:10-step:200:loss:0.2954818606376648, acc:1.0
epoch:10-step:300:loss:0.247454434633255, acc:0.984375
epoch:10-step:400:loss:0.35672372579574585, acc:1.0
epoch:11-step:0:loss:0.26435354351997375, acc:1.0
epoch:11-step:100:loss:0.2703622281551361, acc:1.0
epoch:11-step:200:loss:0.5299990773200989, acc:0.953125
epoch:11-step:300:loss:0.23795157670974731, acc:1.0
epoch:11-step:400:loss:0.2920284867286682, acc:0.984375
epoch:12-step:0:loss:0.2721567153930664, acc:0.96875
epoch:12-step:100:loss:0.35027775168418884, acc:1.0
epoch:12-step:200:loss:0.287830114364624, acc:1.0
epoch:12-step:300:loss:0.3720857799053192, acc:0.9375
epoch:12-step:400:loss:0.27490004897117615, acc:0.984375
epoch:13-step:0:loss:0.3159744441509247, acc:0.984375
epoch:13-step:100:loss:0.2862488627433777, acc:1.0
epoch:13-step:200:loss:0.25372081995010376, acc:1.0
epoch:13-step:300:loss:0.37426963448524475, acc:0.96875
epoch:13-step:400:loss:0.3686545491218567, acc:1.0
epoch:14-step:0:loss:0.274233341217041, acc:1.0
epoch:14-step:100:loss:0.3172175884246826, acc:1.0
epoch:14-step:200:loss:0.5602083206176758, acc:0.9375
epoch:14-step:300:loss:0.2863656282424927, acc:1.0
epoch:14-step:400:loss:0.28616100549697876, acc:0.984375
epoch:15-step:0:loss:0.2829575836658478, acc:1.0
epoch:15-step:100:loss:0.3470420241355896, acc:0.984375
epoch:15-step:200:loss:0.27332547307014465, acc:1.0
epoch:15-step:300:loss:0.28411853313446045, acc:1.0
epoch:15-step:400:loss:0.27183854579925537, acc:1.0
epoch:16-step:0:loss:0.27608978748321533, acc:0.984375
epoch:16-step:100:loss:0.2817429304122925, acc:1.0
epoch:16-step:200:loss:0.2714748978614807, acc:0.984375
epoch:16-step:300:loss:0.2615751624107361, acc:1.0
epoch:16-step:400:loss:0.26123568415641785, acc:0.984375
epoch:17-step:0:loss:0.33730071783065796, acc:0.984375
epoch:17-step:100:loss:0.24686148762702942, acc:1.0
epoch:17-step:200:loss:0.3086400330066681, acc:1.0
epoch:17-step:300:loss:0.5279155969619751, acc:0.96875
epoch:17-step:400:loss:0.32528525590896606, acc:1.0
epoch:18-step:0:loss:0.30509132146835327, acc:1.0
epoch:18-step:100:loss:0.23571285605430603, acc:1.0
epoch:18-step:200:loss:0.2900455594062805, acc:0.984375
epoch:18-step:300:loss:0.3105774521827698, acc:0.984375
epoch:18-step:400:loss:0.28706759214401245, acc:1.0
epoch:19-step:0:loss:0.3488197326660156, acc:0.96875
epoch:19-step:100:loss:0.41094356775283813, acc:0.96875
epoch:19-step:200:loss:0.3227531909942627, acc:0.96875
epoch:19-step:300:loss:0.3176570534706116, acc:0.984375
epoch:19-step:400:loss:0.3314107060432434, acc:0.984375
epoch:20-step:0:loss:0.3362692892551422, acc:0.96875
epoch:20-step:100:loss:0.3193368911743164, acc:0.984375
epoch:20-step:200:loss:0.2940109968185425, acc:0.984375
epoch:20-step:300:loss:0.30383074283599854, acc:1.0
epoch:20-step:400:loss:0.2842479646205902, acc:1.0
epoch:21-step:0:loss:0.31892162561416626, acc:1.0
epoch:21-step:100:loss:0.32428157329559326, acc:0.984375
epoch:21-step:200:loss:0.348207950592041, acc:0.984375
epoch:21-step:300:loss:0.3341505527496338, acc:1.0
epoch:21-step:400:loss:0.33450591564178467, acc:0.984375
epoch:22-step:0:loss:0.3641238808631897, acc:1.0
epoch:22-step:100:loss:0.22999759018421173, acc:1.0
epoch:22-step:200:loss:0.22058755159378052, acc:1.0
epoch:22-step:300:loss:0.23714381456375122, acc:1.0
epoch:22-step:400:loss:0.2500191628932953, acc:1.0
epoch:23-step:0:loss:0.22351427376270294, acc:1.0
epoch:23-step:100:loss:0.2509956955909729, acc:1.0
epoch:23-step:200:loss:0.24430502951145172, acc:1.0
epoch:23-step:300:loss:0.3709629774093628, acc:0.96875
epoch:23-step:400:loss:0.2027788609266281, acc:1.0
epoch:24-step:0:loss:0.23092855513095856, acc:1.0
epoch:24-step:100:loss:0.23781482875347137, acc:1.0
epoch:24-step:200:loss:0.2971919775009155, acc:1.0
epoch:24-step:300:loss:0.338306725025177, acc:1.0
epoch:24-step:400:loss:0.3219115734100342, acc:0.984375
epoch:25-step:0:loss:0.24393007159233093, acc:1.0
epoch:25-step:100:loss:0.3434106409549713, acc:0.9375
epoch:25-step:200:loss:0.2983097732067108, acc:1.0
epoch:25-step:300:loss:0.4493078589439392, acc:0.96875
epoch:25-step:400:loss:0.3198167383670807, acc:1.0
epoch:26-step:0:loss:0.24257059395313263, acc:1.0
epoch:26-step:100:loss:0.31731313467025757, acc:1.0
epoch:26-step:200:loss:0.21102392673492432, acc:1.0
epoch:26-step:300:loss:0.30590930581092834, acc:0.984375
epoch:26-step:400:loss:0.2257823497056961, acc:1.0
epoch:27-step:0:loss:0.310319721698761, acc:1.0
epoch:27-step:100:loss:0.30156540870666504, acc:0.96875
epoch:27-step:200:loss:0.2838490903377533, acc:1.0
epoch:27-step:300:loss:0.3473086953163147, acc:0.984375
epoch:27-step:400:loss:0.2559375762939453, acc:0.984375
epoch:28-step:0:loss:0.310485303401947, acc:0.984375
epoch:28-step:100:loss:0.3607978820800781, acc:0.953125
epoch:28-step:200:loss:0.44219809770584106, acc:0.96875
epoch:28-step:300:loss:0.2542918920516968, acc:1.0
epoch:28-step:400:loss:0.2159016877412796, acc:1.0
epoch:29-step:0:loss:0.22000880539417267, acc:1.0
epoch:29-step:100:loss:0.2266199290752411, acc:1.0
epoch:29-step:200:loss:0.26222696900367737, acc:1.0
epoch:29-step:300:loss:0.3106392025947571, acc:0.984375
epoch:29-step:400:loss:0.36904847621917725, acc:1.0
epoch:30-step:0:loss:0.34375429153442383, acc:1.0
epoch:30-step:100:loss:0.32525262236595154, acc:0.96875
epoch:30-step:200:loss:0.22961777448654175, acc:1.0
epoch:30-step:300:loss:0.3222538232803345, acc:0.984375
epoch:30-step:400:loss:0.34517884254455566, acc:1.0
epoch:31-step:0:loss:0.2945629358291626, acc:0.984375
epoch:31-step:100:loss:0.268366277217865, acc:0.984375
epoch:31-step:200:loss:0.18957987427711487, acc:1.0
epoch:31-step:300:loss:0.2590569853782654, acc:0.984375
epoch:31-step:400:loss:0.3257673978805542, acc:0.984375
epoch:32-step:0:loss:0.26132434606552124, acc:1.0
epoch:32-step:100:loss:0.29284968972206116, acc:1.0
epoch:32-step:200:loss:0.2253561168909073, acc:1.0
epoch:32-step:300:loss:0.3098476529121399, acc:0.984375
epoch:32-step:400:loss:0.3402651250362396, acc:1.0
epoch:33-step:0:loss:0.29601138830184937, acc:1.0
epoch:33-step:100:loss:0.21517103910446167, acc:1.0
epoch:33-step:200:loss:0.24986839294433594, acc:1.0
epoch:33-step:300:loss:0.24151185154914856, acc:1.0
epoch:33-step:400:loss:0.2537617087364197, acc:0.984375
epoch:34-step:0:loss:0.2897787094116211, acc:1.0
epoch:34-step:100:loss:0.3102341294288635, acc:1.0
epoch:34-step:200:loss:0.31013330817222595, acc:0.984375
epoch:34-step:300:loss:0.331012487411499, acc:0.96875
epoch:34-step:400:loss:0.2671318054199219, acc:1.0
epoch:35-step:0:loss:0.25242093205451965, acc:1.0
epoch:35-step:100:loss:0.29493749141693115, acc:1.0
epoch:35-step:200:loss:0.34535685181617737, acc:0.984375
epoch:35-step:300:loss:0.3241482079029083, acc:1.0
epoch:35-step:400:loss:0.2128269076347351, acc:1.0
epoch:36-step:0:loss:0.23637321591377258, acc:1.0
epoch:36-step:100:loss:0.30024927854537964, acc:1.0
epoch:36-step:200:loss:0.24246391654014587, acc:1.0
epoch:36-step:300:loss:0.31126612424850464, acc:1.0
epoch:36-step:400:loss:0.3011528253555298, acc:1.0
epoch:37-step:0:loss:0.2384546995162964, acc:1.0
epoch:37-step:100:loss:0.2325681895017624, acc:1.0
epoch:37-step:200:loss:0.37857574224472046, acc:0.96875
epoch:37-step:300:loss:0.28826019167900085, acc:1.0
epoch:37-step:400:loss:0.2752712070941925, acc:1.0
epoch:38-step:0:loss:0.20408588647842407, acc:1.0
epoch:38-step:100:loss:0.29788440465927124, acc:1.0
epoch:38-step:200:loss:0.25764626264572144, acc:1.0
epoch:38-step:300:loss:0.2581232488155365, acc:1.0
epoch:38-step:400:loss:0.3989195227622986, acc:0.984375
epoch:39-step:0:loss:0.3091227114200592, acc:1.0
epoch:39-step:100:loss:0.3755824565887451, acc:0.984375
epoch:39-step:200:loss:0.26778444647789, acc:1.0
epoch:39-step:300:loss:0.30280816555023193, acc:0.984375
epoch:39-step:400:loss:0.3057660460472107, acc:1.0
epoch:40-step:0:loss:0.2725217342376709, acc:1.0
epoch:40-step:100:loss:0.3863896131515503, acc:0.984375
epoch:40-step:200:loss:0.28988099098205566, acc:1.0
epoch:40-step:300:loss:0.22242814302444458, acc:1.0
epoch:40-step:400:loss:0.3383949398994446, acc:1.0
epoch:41-step:0:loss:0.26388099789619446, acc:1.0
epoch:41-step:100:loss:0.2229824662208557, acc:1.0
epoch:41-step:200:loss:0.292219877243042, acc:0.984375
epoch:41-step:300:loss:0.36415165662765503, acc:0.984375
epoch:41-step:400:loss:0.2665848731994629, acc:1.0
epoch:42-step:0:loss:0.3264614939689636, acc:1.0
epoch:42-step:100:loss:0.2160097360610962, acc:1.0
epoch:42-step:200:loss:0.2968500852584839, acc:1.0
epoch:42-step:300:loss:0.26215144991874695, acc:1.0
epoch:42-step:400:loss:0.22494320571422577, acc:1.0
epoch:43-step:0:loss:0.2729398012161255, acc:1.0
epoch:43-step:100:loss:0.2785150706768036, acc:1.0
epoch:43-step:200:loss:0.23410095274448395, acc:1.0
epoch:43-step:300:loss:0.26621657609939575, acc:0.984375
epoch:43-step:400:loss:0.3136154115200043, acc:1.0
epoch:44-step:0:loss:0.25611886382102966, acc:1.0
epoch:44-step:100:loss:0.20415037870407104, acc:1.0
epoch:44-step:200:loss:0.24167172610759735, acc:1.0
epoch:44-step:300:loss:0.283987820148468, acc:1.0
epoch:44-step:400:loss:0.26916706562042236, acc:1.0
epoch:45-step:0:loss:0.2845613956451416, acc:1.0
epoch:45-step:100:loss:0.26392173767089844, acc:1.0
epoch:45-step:200:loss:0.24768559634685516, acc:0.984375
epoch:45-step:300:loss:0.24893818795681, acc:0.984375
epoch:45-step:400:loss:0.4217151999473572, acc:0.96875
epoch:46-step:0:loss:0.23090697824954987, acc:0.984375
epoch:46-step:100:loss:0.18003663420677185, acc:1.0
epoch:46-step:200:loss:0.24941089749336243, acc:1.0
epoch:46-step:300:loss:0.28483161330223083, acc:1.0
epoch:46-step:400:loss:0.2657340466976166, acc:1.0
epoch:47-step:0:loss:0.2650091350078583, acc:1.0
epoch:47-step:100:loss:0.25943630933761597, acc:1.0
epoch:47-step:200:loss:0.27533236145973206, acc:1.0
epoch:47-step:300:loss:0.33831989765167236, acc:0.984375
epoch:47-step:400:loss:0.23092982172966003, acc:1.0
epoch:48-step:0:loss:0.19768010079860687, acc:1.0
epoch:48-step:100:loss:0.2563469707965851, acc:1.0
epoch:48-step:200:loss:0.205824613571167, acc:1.0
epoch:48-step:300:loss:0.25635892152786255, acc:1.0
epoch:48-step:400:loss:0.2535836398601532, acc:1.0
epoch:49-step:0:loss:0.26322126388549805, acc:1.0
epoch:49-step:100:loss:0.2160058319568634, acc:1.0
epoch:49-step:200:loss:0.24737797677516937, acc:1.0
epoch:49-step:300:loss:0.27136537432670593, acc:1.0
epoch:49-step:400:loss:0.2188822627067566, acc:1.0
epoch:50-step:0:loss:0.22991859912872314, acc:1.0
epoch:50-step:100:loss:0.24580472707748413, acc:1.0
epoch:50-step:200:loss:0.25465500354766846, acc:1.0
epoch:50-step:300:loss:0.3008488416671753, acc:1.0
epoch:50-step:400:loss:0.3736513555049896, acc:0.984375
epoch:51-step:0:loss:0.2764037549495697, acc:1.0
epoch:51-step:100:loss:0.23832055926322937, acc:1.0
epoch:51-step:200:loss:0.3055259585380554, acc:1.0
epoch:51-step:300:loss:0.1988280862569809, acc:1.0
epoch:51-step:400:loss:0.3063410520553589, acc:0.984375
epoch:52-step:0:loss:0.2538548409938812, acc:1.0
epoch:52-step:100:loss:0.20880213379859924, acc:1.0
epoch:52-step:200:loss:0.24543236196041107, acc:1.0
epoch:52-step:300:loss:0.2540209889411926, acc:0.984375
epoch:52-step:400:loss:0.3688245415687561, acc:0.984375
epoch:53-step:0:loss:0.2730008363723755, acc:1.0
epoch:53-step:100:loss:0.2570326030254364, acc:1.0
epoch:53-step:200:loss:0.3111269176006317, acc:0.984375
epoch:53-step:300:loss:0.22246724367141724, acc:1.0
epoch:53-step:400:loss:0.21843045949935913, acc:1.0
epoch:54-step:0:loss:0.24925944209098816, acc:1.0
epoch:54-step:100:loss:0.20775897800922394, acc:1.0
epoch:54-step:200:loss:0.3513832986354828, acc:1.0
epoch:54-step:300:loss:0.3000152111053467, acc:1.0
epoch:54-step:400:loss:0.258748859167099, acc:1.0
epoch:55-step:0:loss:0.2483396977186203, acc:1.0
epoch:55-step:100:loss:0.27720218896865845, acc:1.0
epoch:55-step:200:loss:0.21681155264377594, acc:1.0
epoch:55-step:300:loss:0.24716193974018097, acc:1.0
epoch:55-step:400:loss:0.2578742802143097, acc:1.0
epoch:56-step:0:loss:0.2398233860731125, acc:1.0
epoch:56-step:100:loss:0.2771010994911194, acc:1.0
epoch:56-step:200:loss:0.2673088610172272, acc:0.96875
epoch:56-step:300:loss:0.2704741954803467, acc:1.0
epoch:56-step:400:loss:0.2808868885040283, acc:1.0
epoch:57-step:0:loss:0.3065989315509796, acc:1.0
epoch:57-step:100:loss:0.2522042989730835, acc:0.984375
epoch:57-step:200:loss:0.260513573884964, acc:1.0
epoch:57-step:300:loss:0.2332201600074768, acc:1.0
epoch:57-step:400:loss:0.23818844556808472, acc:1.0
epoch:58-step:0:loss:0.2931153476238251, acc:1.0
epoch:58-step:100:loss:0.2822316884994507, acc:1.0
epoch:58-step:200:loss:0.27821218967437744, acc:1.0
epoch:58-step:300:loss:0.2567179501056671, acc:1.0
epoch:58-step:400:loss:0.23723331093788147, acc:1.0
epoch:59-step:0:loss:0.18063001334667206, acc:1.0
epoch:59-step:100:loss:0.2520148754119873, acc:0.984375
epoch:59-step:200:loss:0.2711891829967499, acc:0.984375
epoch:59-step:300:loss:0.23837852478027344, acc:1.0
epoch:59-step:400:loss:0.2211880385875702, acc:1.0
epoch:60-step:0:loss:0.2950493097305298, acc:1.0
epoch:60-step:100:loss:0.2939278781414032, acc:1.0
epoch:60-step:200:loss:0.22991259396076202, acc:1.0
epoch:60-step:300:loss:0.2874300479888916, acc:1.0
epoch:60-step:400:loss:0.22187215089797974, acc:1.0
epoch:61-step:0:loss:0.23016515374183655, acc:1.0
epoch:61-step:100:loss:0.2482266128063202, acc:1.0
epoch:61-step:200:loss:0.24248459935188293, acc:1.0
epoch:61-step:300:loss:0.24449986219406128, acc:1.0
epoch:61-step:400:loss:0.31137168407440186, acc:1.0
epoch:62-step:0:loss:0.2627957761287689, acc:1.0
epoch:62-step:100:loss:0.2321496307849884, acc:1.0
epoch:62-step:200:loss:0.25001847743988037, acc:1.0
epoch:62-step:300:loss:0.2198486477136612, acc:1.0
epoch:62-step:400:loss:0.25825434923171997, acc:1.0
epoch:63-step:0:loss:0.2218083292245865, acc:1.0
epoch:63-step:100:loss:0.2533058524131775, acc:1.0
epoch:63-step:200:loss:0.317036509513855, acc:1.0
epoch:63-step:300:loss:0.27918335795402527, acc:0.984375
epoch:63-step:400:loss:0.324290007352829, acc:0.984375
epoch:64-step:0:loss:0.30735957622528076, acc:1.0
epoch:64-step:100:loss:0.2519816756248474, acc:1.0
epoch:64-step:200:loss:0.26827216148376465, acc:1.0
epoch:64-step:300:loss:0.23170748353004456, acc:1.0
epoch:64-step:400:loss:0.268909752368927, acc:1.0
epoch:65-step:0:loss:0.22210651636123657, acc:1.0
epoch:65-step:100:loss:0.29223525524139404, acc:1.0
epoch:65-step:200:loss:0.28932085633277893, acc:0.984375
epoch:65-step:300:loss:0.39013928174972534, acc:1.0
epoch:65-step:400:loss:0.22963924705982208, acc:1.0
epoch:66-step:0:loss:0.3358096182346344, acc:1.0
epoch:66-step:100:loss:0.25039270520210266, acc:1.0
epoch:66-step:200:loss:0.1932317316532135, acc:1.0
epoch:66-step:300:loss:0.28013551235198975, acc:1.0
epoch:66-step:400:loss:0.2550528049468994, acc:0.984375
epoch:67-step:0:loss:0.20987680554389954, acc:1.0
epoch:67-step:100:loss:0.3133028745651245, acc:1.0
epoch:67-step:200:loss:0.2129879891872406, acc:1.0
epoch:67-step:300:loss:0.25465527176856995, acc:1.0
epoch:67-step:400:loss:0.2804788649082184, acc:1.0
epoch:68-step:0:loss:0.1931174397468567, acc:1.0
epoch:68-step:100:loss:0.22071534395217896, acc:1.0
epoch:68-step:200:loss:0.28408822417259216, acc:1.0
epoch:68-step:300:loss:0.2323785275220871, acc:1.0
epoch:68-step:400:loss:0.23120257258415222, acc:1.0
epoch:69-step:0:loss:0.3071490228176117, acc:1.0
epoch:69-step:100:loss:0.2749641537666321, acc:1.0
epoch:69-step:200:loss:0.31480032205581665, acc:1.0
epoch:69-step:300:loss:0.24446997046470642, acc:1.0
epoch:69-step:400:loss:0.24190063774585724, acc:1.0
epoch:70-step:0:loss:0.2694661319255829, acc:1.0
epoch:70-step:100:loss:0.29563063383102417, acc:1.0
epoch:70-step:200:loss:0.184425950050354, acc:1.0
epoch:70-step:300:loss:0.2968219518661499, acc:0.984375
epoch:70-step:400:loss:0.22127212584018707, acc:1.0
epoch:71-step:0:loss:0.2404927909374237, acc:1.0
epoch:71-step:100:loss:0.23983702063560486, acc:1.0
epoch:71-step:200:loss:0.27070319652557373, acc:1.0
epoch:71-step:300:loss:0.2776453495025635, acc:1.0
epoch:71-step:400:loss:0.20155584812164307, acc:1.0
epoch:72-step:0:loss:0.27350640296936035, acc:1.0
epoch:72-step:100:loss:0.2357451319694519, acc:1.0
epoch:72-step:200:loss:0.2813214659690857, acc:1.0
epoch:72-step:300:loss:0.25063154101371765, acc:1.0
epoch:72-step:400:loss:0.22274819016456604, acc:1.0
epoch:73-step:0:loss:0.2831312417984009, acc:1.0
epoch:73-step:100:loss:0.2348935604095459, acc:1.0
epoch:73-step:200:loss:0.21979132294654846, acc:1.0
epoch:73-step:300:loss:0.22272056341171265, acc:1.0
epoch:73-step:400:loss:0.2787545919418335, acc:1.0
[... output truncated: epochs 74-198 continue in the same pattern, with per-batch loss fluctuating between roughly 0.17 and 0.45 and training-batch accuracy at or near 1.0 throughout ...]
epoch:199-step:0:loss:0.2182619571685791, acc:1.0
epoch:199-step:100:loss:0.22599707543849945, acc:1.0
epoch:199-step:200:loss:0.24307340383529663, acc:1.0
epoch:199-step:300:loss:0.2532089948654175, acc:1.0
epoch:199-step:400:loss:0.23235024511814117, acc:1.0
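The raw per-step log above is hard to read at a glance; one option is to parse it back into numbers and plot a loss curve. A minimal sketch (the `parse_log` helper and `LOG_RE` pattern are illustrative names, matching the `epoch:E-step:S:loss:L, acc:A` format printed by the training loop):

```python
import re

# Matches lines like "epoch:199-step:400:loss:0.232, acc:1.0".
LOG_RE = re.compile(r"epoch:(\d+)-step:(\d+):loss:([\d.]+), acc:([\d.]+)")

def parse_log(lines):
    records = []
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            epoch, step, loss, acc = m.groups()
            records.append((int(epoch), int(step), float(loss), float(acc)))
    return records

records = parse_log([
    "epoch:199-step:300:loss:0.2532089948654175, acc:1.0",
    "epoch:199-step:400:loss:0.23235024511814117, acc:1.0",
])
# records -> [(199, 300, 0.2532089948654175, 1.0), (199, 400, 0.23235024511814117, 1.0)]
```

From `records`, plotting the loss column against a running step index with `plt.plot` gives the training curve.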
In [740]:
def inference_batch(sess, input_data, label_data, index_from, index_end):
    """Run one forward pass over a slice of the data set."""
    batch_x = input_data[index_from:index_end]
    batch_y = label_data[index_from:index_end]
    return sess.run([pred_correct, output, loss, acc_op],
                    feed_dict={input_layer: batch_x, label_layer: batch_y})

def inference(input_data, label_data):
    BATCH_SIZE = 64
    result_total = []
    score_total = []
    loss_total = []
    acc_total = []

    with tf.Session() as sess:
        total_sample = label_data.shape[0]
        saver = tf.train.Saver()
        saver.restore(sess, save_path='/source/SDC_Andrew/SDC/T1/traffic-signs/model3.ckpt')

        steps = total_sample // BATCH_SIZE

        # Full batches first.
        for step in range(steps):
            result_t, score_t, loss_t, acc_t = inference_batch(
                sess, input_data, label_data, step * BATCH_SIZE, (step + 1) * BATCH_SIZE)
            result_total.append(result_t)
            score_total.append(score_t)
            loss_total.append(loss_t)
            acc_total.append(acc_t)

        # Trailing partial batch; skipped when the data divides evenly,
        # which would otherwise feed an empty batch to the session.
        if steps * BATCH_SIZE < total_sample:
            result_t, score_t, loss_t, acc_t = inference_batch(
                sess, input_data, label_data, steps * BATCH_SIZE, total_sample)
            result_total.append(result_t)
            score_total.append(score_t)
            loss_total.append(loss_t)
            acc_total.append(acc_t)

    # Flatten the per-batch lists into per-sample lists; average the scalars.
    return ([ele for batch in result_total for ele in batch],
            [ele for batch in score_total for ele in batch],
            np.mean(loss_total), np.mean(acc_total))

result_test, score_test, loss_test, acc_test = inference(X_test_norm, y_test)
In [741]:
from sklearn.metrics import confusion_matrix
import seaborn
score_test = np.array(score_test)
print("loss:{0}, acc:{1}".format(loss_test, acc_test))
y_test_pred = np.argmax(score_test, axis=1)
cm = confusion_matrix(y_test, y_test_pred)
plt.figure(figsize = (15,10))
seaborn.heatmap(cm)
plt.show()

# dump the 49 highest-cost predictions
# per-sample cost is the cross-entropy: -log of the probability assigned to the true class
cost = list(enumerate(-np.log(score_test[range(y_test.shape[0]), y_test])))

def show_images(data, label, label_pred, cost, max_count = -1):
    if max_count == -1:
        counts = data.shape[0]
    else:
        counts = max_count

    col = 7
    row = int(np.ceil(counts / col))
    cost = cost[:counts]
    plt.figure(figsize = (15, 3*row))

    for i, (index, _) in enumerate(cost):
        # subplot takes (rows, cols, index); the original had them swapped
        plt.subplot(row, col, i+1)
        plt.axis('off')
        plt.imshow(data[index, :, :])
        plt.title("{0}->{1} ({2})".format(label[index], label_pred[index], index))

    plt.show()

cost_test_wrong = list(filter(lambda pair: y_test_pred[pair[0]] != y_test[pair[0]], cost))
cost_test_wrong.sort(key = lambda pair: pair[1], reverse = True)
show_images(X_test, y_test, y_test_pred, cost_test_wrong, 49)
loss:0.3918231129646301, acc:0.9716841578483582

Question 4

How did you train your model? (Type of optimizer, batch size, epochs, hyperparameters, etc.)

Answer: I used the Adam optimizer with a learning rate of 0.001, a batch size of 64, and 200 epochs.
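For reference (this is not part of the notebook's actual training code, which uses `tf.train.AdamOptimizer`), a single Adam update step with the hyperparameters above can be sketched in plain NumPy, using TensorFlow's default beta1/beta2/epsilon values:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update: exponential moving averages of the gradient (m)
    # and squared gradient (v), bias-corrected, then a scaled step.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

w = np.array([1.0])
g = np.array([0.5])
# At t=1 the bias-corrected step is ~lr * sign(grad), so w drops to ~0.999.
w, m, v = adam_step(w, g, np.zeros(1), np.zeros(1), t=1)
```

This is why the first steps of Adam behave like sign-gradient descent with step size `lr`, regardless of gradient magnitude.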

Question 5

What approach did you take in coming up with a solution to this problem?

Answer: The DenseNet consists of L-1 subsequent convolutional layers, each followed by a composite function (Batch Normalization, ReLU) and pooling. Due to memory limitations, training uses mini-batch SGD: at each step, 64 (batch size) samples are randomly selected from the training data and the model is updated by gradient descent with a learning rate of 0.001. The objective is to minimize the cross-entropy loss of the (softmax) output layer. The architecture achieves 100% accuracy on the training set (loss: 0.2-0.3). Because DenseNet with the Adam optimizer already works quite well, I only tuned the learning rate, choosing between 0.001 and 0.0005 on the validation set. The final results were very similar, but 0.001 converged faster.
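To make the dense connectivity concrete, here is a small helper (purely illustrative, not part of the model code) that computes how many input channels each layer in a dense block sees, assuming an initial channel count k0 and growth rate k: layer l receives the concatenation of the block input and all l preceding layer outputs.

```python
import numpy as np

def dense_block_channels(k0, growth_rate, num_layers):
    # In a DenseNet block, layer l's input is the concatenation of the
    # block input and all previous outputs: k0 + l * growth_rate channels.
    # Each layer emits growth_rate new channels.
    inputs = []
    c = k0
    for l in range(num_layers):
        inputs.append(c)     # channels seen by layer l
        c += growth_rate     # concatenate this layer's output
    return inputs, c

ins, out = dense_block_channels(16, 12, 4)
# ins -> [16, 28, 40, 52]; the block's output has 64 channels
```

This linear channel growth is what makes the per-block memory footprint the limiting factor mentioned above.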


Step 3: Test a Model on New Images

Take several pictures of traffic signs that you find on the web or around you (at least five), and run them through your classifier on your computer to produce example results. The classifier might not recognize some local signs but it could prove interesting nonetheless.

You may find signnames.csv useful as it contains mappings from the class id (integer) to the actual sign name.

Implementation

Use the code cell (or multiple code cells, if necessary) to implement the first step of your project. Once you have completed your implementation and are satisfied with the results, be sure to thoroughly answer the questions that follow.

In [737]:
### Load the images and plot them here.
### Feel free to use as many code cells as needed.
from scipy import misc
import os

#most samples are collected from:
#http://www.gettingaroundgermany.info/zeichen.shtml

test_dir = './test_sample'

def load_new_images(img_dir):
    # Note: scipy.misc.imread/imresize are deprecated in newer SciPy
    # releases; imageio.imread and PIL-based resizing are the modern replacements.
    new_images = []
    new_labels = []

    for root, dirs, files in os.walk(img_dir):
        for fn in files:
            fp = os.path.join(root, fn)
            img = misc.imread(fp)
            # the file name (without extension) encodes the class id
            label = int(os.path.splitext(fn)[0])
            new_labels.append(label)
            img = misc.imresize(img, image_shape)
            new_images.append(img)

    return new_images, new_labels

new_images, new_labels = load_new_images(test_dir)
new_images = np.array(new_images)
new_labels = np.array(new_labels)

result_new, score_new, loss_new, acc_new = inference(preprocess(new_images.astype(np.float64)), new_labels)
In [738]:
import matplotlib.ticker as mtick
from matplotlib.ticker import FuncFormatter

def show_images_with_prop(images, labels, signames, results, scores):
    col = 4
    row = int(np.ceil(images.shape[0] / col))
    plt.figure(figsize=(15, 4*row))
    
    for i in range(images.shape[0]):
        s = list(enumerate(scores[i]))
        s.sort(key = lambda pair:pair[1], reverse=True)
        r = i//(col)
        c = i%col+1

        i1 = r*col*2+c
        i2 = r*col*2+c+col

        img = images[i,:,:,:]
        plt.subplot(row*2, col, i1)
        plt.axis('off')
        plt.title("{0:2d} {1}".format(labels[i], signames[labels[i]]))
        plt.imshow(img)

        plt.subplot(row*2, col, i2)
        #plt.xlabel("confidence")
        #plt.ylabel("category")
        prop = [p for i,p in s]
        cat = [i for i,p in s]
        plt.title("{0:2d} {1}".format(s[0][0], signames[s[0][0]]))
        plt.barh(cat, prop)

    plt.tight_layout()
    plt.show()

print("loss: {0}, acc: {1}".format(loss_new, acc_new))
loss: 0.154750257730484, acc: 1.0

Question 6

Choose five candidate images of traffic signs and provide them in the report. Are there any particular qualities of the image(s) that might make classification difficult? It would be helpful to plot the images in the notebook.

Answer:

Because all the samples in the newly captured data set are of good quality, the accuracy is 100% and none of them are difficult to classify. In the test data set, the following images are difficult to classify: 496, 10845, 9880, 310, 9434.

In [706]:
difficult_index = [496, 10845, 9880, 310, 9434]
difficult_images = X_test[difficult_index]
difficult_labels = y_test[difficult_index]
difficult_result = np.array(result_test)[difficult_index]
difficult_score = score_test[difficult_index]
show_images_with_prop(difficult_images, difficult_labels, sign_names, difficult_result, difficult_score)

Question 7

Is your model able to perform equally well on captured pictures when compared to testing on the dataset?

Answer: Yes, my model performs as well on the captured pictures (100%) as on the test data set (97%).

In [707]:
### Visualize the softmax probabilities here.
### Feel free to use as many code cells as needed.
show_images_with_prop(new_images, new_labels, sign_names, result_new, score_new)

Question 8

Use the model's softmax probabilities to visualize the certainty of its predictions, tf.nn.top_k could prove helpful here. Which predictions is the model certain of? Uncertain? If the model was incorrect in its initial prediction, does the correct prediction appear in the top k? (k should be 5 at most)

Answer: In the newly captured data set the accuracy is 100%; except for "Slippery road" (23), all predictions are both certain and correct. In the test data set, the worst predictions (largest loss) are very confident but wrong. On the other hand, among the best wrong predictions (smallest loss), the second most confident class is almost always the correct answer. The top-5 predictions for all wrong cases are printed below, followed by a visualization of the wrong predictions on the test data set.
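The top-5 extraction used here can equivalently be done with `np.argsort` (mirroring what `tf.nn.top_k` returns); a toy example over three classes:

```python
import numpy as np

# Toy softmax scores for one image over three classes.
scores = np.array([0.05, 0.60, 0.35])

k = 2
# argsort is ascending, so take the last k indices and reverse them;
# this matches the (values, indices) pair that tf.nn.top_k would return.
top_idx = np.argsort(scores)[-k:][::-1]
top_val = scores[top_idx]
# top_idx -> [1, 2], top_val -> [0.60, 0.35]
```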

In [756]:
cost_test_wrong.sort(key = lambda pair: pair[1], reverse = False)

for i, c in cost_test_wrong:
    score_ = list(enumerate(score_test[i]))
    score_.sort(key = lambda pair: pair[1], reverse = True)
    msg = "{0:5} {1:3} -> ".format(i, y_test[i])
    for j in range(5):
        msg += "{0:3} ({1:.2f})".format(score_[j][0], score_[j][1])
    print(msg)
12276  42 ->  41 (0.49) 42 (0.46)  3 (0.01) 38 (0.00)  6 (0.00)
 1560  42 ->  41 (0.58) 42 (0.34) 18 (0.01) 12 (0.01)  7 (0.00)
12105  19 ->  21 (0.59) 19 (0.34) 17 (0.01) 23 (0.01) 12 (0.01)
 4732  19 ->  23 (0.62) 19 (0.33)  8 (0.01) 34 (0.00)  2 (0.00)
 5972  38 ->  34 (0.60) 38 (0.31) 13 (0.01) 18 (0.01) 12 (0.01)
 1307  19 ->  23 (0.42) 19 (0.31)  2 (0.02)  1 (0.02) 10 (0.02)
 6210  19 ->  23 (0.35) 19 (0.30)  2 (0.03) 13 (0.02)  1 (0.02)
 3591   7 ->   5 (0.34)  7 (0.28)  4 (0.04)  2 (0.02)  1 (0.02)
 6183  38 ->  34 (0.63) 38 (0.25) 11 (0.03)  1 (0.01) 42 (0.01)
10135   3 ->   5 (0.62)  3 (0.25) 18 (0.01) 13 (0.01)  2 (0.01)
 2191   5 ->   2 (0.47)  5 (0.22) 13 (0.03) 38 (0.02)  1 (0.02)
 6766  18 ->  31 (0.50) 18 (0.18)  1 (0.03)  2 (0.02) 13 (0.02)
 7886  18 ->  22 (0.25) 18 (0.18) 13 (0.04)  2 (0.04) 10 (0.04)
 8821  42 ->  41 (0.58) 42 (0.17) 38 (0.02)  2 (0.02) 12 (0.02)
 2924   5 ->   1 (0.55)  5 (0.16)  2 (0.02) 13 (0.02) 12 (0.02)
 8736   5 ->  20 (0.53)  5 (0.15) 10 (0.03)  2 (0.02)  8 (0.02)
12082  10 ->  31 (0.40) 10 (0.15)  2 (0.04)  1 (0.03)  5 (0.03)
   49   8 ->   1 (0.55)  8 (0.15)  2 (0.03) 38 (0.02) 12 (0.02)
11183  26 ->   8 (0.42) 26 (0.14)  1 (0.03)  4 (0.03)  2 (0.03)
 7003   7 ->   8 (0.39)  7 (0.14)  2 (0.04)  1 (0.04) 38 (0.03)
 4093   1 ->   7 (0.50)  1 (0.13)  2 (0.03) 13 (0.03) 12 (0.03)
12481   5 ->   7 (0.60)  5 (0.13) 13 (0.02)  2 (0.02)  4 (0.02)
 3296   3 ->   5 (0.72)  3 (0.13) 38 (0.02)  2 (0.01) 13 (0.01)
 8663  12 ->  32 (0.66) 12 (0.13)  1 (0.02) 13 (0.02)  2 (0.02)
 8544  39 ->  13 (0.24) 39 (0.12) 38 (0.06)  2 (0.05)  1 (0.05)
 7714  25 ->  11 (0.37) 25 (0.12) 13 (0.05)  2 (0.04) 10 (0.03)
12546  24 ->  29 (0.76) 24 (0.11)  4 (0.01)  1 (0.01)  2 (0.01)
 2025   4 ->  15 (0.14)  4 (0.11) 14 (0.08) 13 (0.06) 33 (0.05)
 7279  12 ->  32 (0.18) 12 (0.10) 38 (0.06)  1 (0.06)  2 (0.06)
 3056  19 ->  23 (0.83) 19 (0.10)  7 (0.01)  2 (0.01) 15 (0.01)
 7593  26 ->  18 (0.20) 26 (0.10)  1 (0.07)  2 (0.06) 13 (0.05)
 6582  10 ->  42 (0.79) 10 (0.10)  1 (0.01) 13 (0.01)  3 (0.01)
  637  29 ->  23 (0.12) 29 (0.10) 38 (0.06)  2 (0.06)  1 (0.05)
  781  39 ->  33 (0.54) 39 (0.10) 13 (0.05)  9 (0.03)  1 (0.02)
10187  19 ->  23 (0.60) 19 (0.10)  2 (0.02)  1 (0.02) 38 (0.02)
  145  25 ->   2 (0.11) 25 (0.09) 13 (0.08)  1 (0.06) 10 (0.06)
12040  26 ->  25 (0.19) 13 (0.11) 26 (0.09)  2 (0.05)  4 (0.05)
11508   8 ->   5 (0.16) 13 (0.13)  8 (0.09)  9 (0.08)  1 (0.05)
11817  21 ->  25 (0.59) 21 (0.08) 12 (0.03)  2 (0.02)  1 (0.02)
 9769  30 ->  34 (0.46) 30 (0.08)  1 (0.04)  2 (0.03) 38 (0.03)
10028  18 ->  22 (0.50) 18 (0.08) 13 (0.03)  2 (0.03)  4 (0.03)
11238   8 ->   1 (0.18)  5 (0.14)  8 (0.08)  2 (0.05) 13 (0.05)
11089  18 ->  11 (0.42) 31 (0.09) 18 (0.07)  1 (0.04) 13 (0.03)
 4377   1 ->   4 (0.54)  1 (0.07) 13 (0.03)  2 (0.03) 10 (0.03)
 8144  12 ->  38 (0.22) 13 (0.07) 12 (0.07)  2 (0.06)  1 (0.05)
 7468   3 ->   5 (0.79)  3 (0.07) 38 (0.01) 13 (0.01)  2 (0.01)
10384   9 ->  10 (0.58)  9 (0.06) 13 (0.03) 38 (0.03)  2 (0.03)
 6129  38 ->  34 (0.83) 38 (0.06) 18 (0.01)  1 (0.01) 35 (0.01)
12627   6 ->  38 (0.26) 32 (0.12)  1 (0.09)  2 (0.07)  6 (0.06)
 4886  18 ->  31 (0.22)  1 (0.07) 18 (0.06) 10 (0.05) 12 (0.04)
 5149  18 ->  27 (0.31) 31 (0.10) 35 (0.07) 18 (0.06) 10 (0.04)
 9613  12 ->  32 (0.22)  1 (0.15)  2 (0.05) 12 (0.05) 10 (0.04)
10708   4 ->  25 (0.36)  1 (0.10)  4 (0.05)  2 (0.04)  5 (0.04)
 5625  10 ->   3 (0.12) 29 (0.08)  1 (0.07) 38 (0.07) 12 (0.05)
 5332  25 ->  31 (0.35) 13 (0.05) 25 (0.05)  2 (0.04)  4 (0.04)
  603  19 ->  23 (0.84) 19 (0.05) 31 (0.01) 15 (0.01) 18 (0.01)
 6416  10 ->   3 (0.37) 28 (0.08) 12 (0.07) 10 (0.05) 38 (0.04)
12339  22 ->  20 (0.25) 13 (0.06) 12 (0.05)  2 (0.05) 10 (0.05)
 5307   2 ->   5 (0.70)  2 (0.04) 13 (0.02)  1 (0.02)  4 (0.02)
10714  18 ->  31 (0.87) 18 (0.04) 13 (0.01)  1 (0.01) 10 (0.01)
 1573   3 ->   5 (0.43)  2 (0.12) 13 (0.04)  3 (0.04)  1 (0.03)
 4803   4 ->  12 (0.49)  9 (0.05)  4 (0.04)  2 (0.03) 13 (0.03)
 7276  10 ->   1 (0.44) 13 (0.04)  2 (0.04) 12 (0.04)  5 (0.04)
11128   6 ->   5 (0.37) 38 (0.07)  1 (0.04) 12 (0.04)  2 (0.04)
10866  25 ->  27 (0.90) 25 (0.03) 13 (0.01) 38 (0.01)  2 (0.00)
 7072  30 ->  11 (0.40) 13 (0.05) 12 (0.04)  2 (0.04)  4 (0.04)
 9733   4 ->  12 (0.49)  1 (0.04)  2 (0.04) 38 (0.04) 10 (0.03)
  265  12 ->  32 (0.47) 38 (0.04)  1 (0.04)  2 (0.04) 13 (0.03)
12363   8 ->   5 (0.24) 20 (0.15)  2 (0.05)  4 (0.04) 12 (0.04)
 1317  25 ->  11 (0.74) 25 (0.03)  2 (0.02) 12 (0.02) 13 (0.02)
 3303   6 ->   5 (0.76) 38 (0.06)  6 (0.03)  4 (0.01)  1 (0.01)
12174   4 ->   1 (0.70)  4 (0.03)  2 (0.02) 10 (0.02) 38 (0.02)
11543  12 ->   1 (0.66)  2 (0.03) 12 (0.03)  4 (0.02) 13 (0.02)
  908  38 ->  34 (0.39) 10 (0.09)  2 (0.04) 13 (0.03)  1 (0.03)
 2836   5 ->   1 (0.57)  4 (0.04) 13 (0.03) 12 (0.03) 38 (0.03)
 1747  12 ->  15 (0.62)  1 (0.03) 12 (0.03)  2 (0.02) 38 (0.02)
 6090  30 ->  24 (0.89) 30 (0.02)  1 (0.01) 10 (0.00) 38 (0.00)
  198  18 ->  31 (0.43)  1 (0.05) 13 (0.04) 10 (0.04)  2 (0.04)
 5227  11 ->  29 (0.88) 11 (0.02) 33 (0.01)  8 (0.01) 10 (0.01)
10480  18 ->  31 (0.49) 12 (0.04) 10 (0.04)  2 (0.04) 13 (0.04)
 2907  17 ->  12 (0.78) 17 (0.02)  1 (0.02) 13 (0.01)  2 (0.01)
 8517   5 ->   1 (0.67)  2 (0.03) 12 (0.02) 13 (0.02) 10 (0.02)
 8869  39 ->  13 (0.47)  2 (0.04) 38 (0.04) 10 (0.04)  4 (0.03)
 4639  30 ->  34 (0.89) 30 (0.02)  3 (0.01)  9 (0.01) 17 (0.01)
 8991  19 ->  23 (0.31)  2 (0.05)  1 (0.05) 10 (0.04) 38 (0.04)
11132  25 ->  27 (0.90) 25 (0.02)  2 (0.01) 13 (0.01) 10 (0.01)
 1781  38 ->  34 (0.72)  2 (0.02)  1 (0.02) 10 (0.02) 38 (0.02)
10067  25 ->  27 (0.60)  2 (0.03) 10 (0.03)  1 (0.03)  5 (0.02)
11141   4 ->   7 (0.72) 38 (0.02)  1 (0.02)  2 (0.02) 13 (0.02)
 4945  19 ->  23 (0.47)  3 (0.06) 33 (0.04)  4 (0.04)  6 (0.03)
 4601   4 ->  20 (0.64) 12 (0.02)  2 (0.02) 13 (0.02) 38 (0.02)
 1200   5 ->   2 (0.79)  1 (0.02)  5 (0.02) 38 (0.01)  4 (0.01)
 6324   6 ->   5 (0.79)  4 (0.02)  1 (0.02)  6 (0.02)  2 (0.02)
 3173  38 ->  34 (0.73)  2 (0.02)  1 (0.02) 38 (0.02) 12 (0.02)
 6874   3 ->   5 (0.61)  2 (0.03)  1 (0.03) 38 (0.03) 12 (0.02)
11980   8 ->   3 (0.52)  1 (0.04)  2 (0.03) 13 (0.03)  4 (0.03)
 6778  39 ->  33 (0.29) 17 (0.11) 13 (0.06)  2 (0.04)  8 (0.04)
 8332  31 ->   5 (0.66)  2 (0.04) 13 (0.03)  3 (0.02) 10 (0.02)
10477  18 ->  31 (0.57)  2 (0.03) 12 (0.03) 10 (0.03) 13 (0.03)
10524  25 ->  11 (0.69)  1 (0.02) 13 (0.02) 12 (0.02)  2 (0.02)
   50  25 ->   1 (0.81) 25 (0.01)  2 (0.01) 13 (0.01) 12 (0.01)
10546   1 ->   6 (0.73) 40 (0.09)  8 (0.05)  1 (0.01)  5 (0.01)
 8460  21 ->  12 (0.91) 21 (0.01) 25 (0.01)  1 (0.01)  2 (0.00)
11650   3 ->   5 (0.66)  2 (0.02) 13 (0.02) 10 (0.02)  1 (0.02)
 9330  25 ->  26 (0.73) 27 (0.14) 19 (0.06) 25 (0.01)  9 (0.00)
 8355  31 ->   2 (0.32)  1 (0.19) 38 (0.04)  3 (0.03) 10 (0.03)
 2577   4 ->   1 (0.75) 13 (0.03) 25 (0.02) 12 (0.02)  2 (0.02)
 3552  19 ->  23 (0.58) 25 (0.06)  3 (0.06) 26 (0.04) 31 (0.04)
 5476  11 ->  27 (0.47) 20 (0.07) 30 (0.06)  7 (0.03)  1 (0.03)
 3860  25 ->  11 (0.70) 13 (0.02)  2 (0.02) 10 (0.02)  4 (0.02)
 1819   6 ->   5 (0.58) 38 (0.03)  1 (0.03)  2 (0.03)  8 (0.03)
11170  21 ->  31 (0.65) 12 (0.04) 25 (0.02) 10 (0.02)  2 (0.02)
 7671   5 ->   1 (0.83)  5 (0.01) 10 (0.01) 13 (0.01)  4 (0.01)
 6613   3 ->   5 (0.68)  2 (0.02)  1 (0.02) 38 (0.02) 12 (0.02)
  235  19 ->  23 (0.87) 35 (0.02)  7 (0.01) 19 (0.01)  3 (0.01)
 4983  12 ->  15 (0.80)  1 (0.01)  4 (0.01) 38 (0.01) 12 (0.01)
 5872  12 ->  25 (0.81)  1 (0.02) 38 (0.01) 13 (0.01) 12 (0.01)
 1660   7 ->   8 (0.71)  2 (0.02)  1 (0.02) 13 (0.02) 12 (0.02)
12032  21 ->  11 (0.63)  8 (0.05) 30 (0.04) 14 (0.04) 22 (0.02)
 4716   7 ->   5 (0.76)  1 (0.02)  2 (0.02) 13 (0.02) 38 (0.02)
12608  18 ->  26 (0.26) 27 (0.13)  3 (0.07)  2 (0.05) 36 (0.05)
 8788   7 ->   8 (0.71)  2 (0.02) 38 (0.02)  1 (0.02) 13 (0.02)
 9087  18 ->  20 (0.63) 13 (0.03) 25 (0.03)  2 (0.02)  1 (0.02)
10057   5 ->   2 (0.81)  1 (0.02) 13 (0.02)  5 (0.01) 12 (0.01)
 6878  18 ->  31 (0.69)  1 (0.02)  2 (0.02) 38 (0.02) 12 (0.02)
 1597  26 ->  18 (0.11) 12 (0.08) 28 (0.06) 13 (0.05)  1 (0.05)
11438   6 ->  12 (0.59)  5 (0.06)  1 (0.04)  2 (0.03) 38 (0.03)
11625  26 ->   8 (0.33) 17 (0.08)  1 (0.05)  2 (0.04) 12 (0.04)
 5562  26 ->   8 (0.52) 30 (0.06)  2 (0.03) 12 (0.03)  1 (0.03)
 2394   7 ->   1 (0.74) 13 (0.02)  2 (0.02) 12 (0.02) 10 (0.02)
 7330   7 ->   8 (0.72)  2 (0.02)  1 (0.02) 38 (0.02) 13 (0.02)
 3960  18 ->  27 (0.91) 18 (0.01)  5 (0.01)  2 (0.01) 13 (0.01)
 6255   6 ->   5 (0.54)  1 (0.04) 38 (0.03) 12 (0.03) 13 (0.03)
11756  12 ->  15 (0.80)  2 (0.02)  1 (0.01) 13 (0.01) 10 (0.01)
 5182  12 ->  15 (0.80)  2 (0.01)  1 (0.01)  7 (0.01) 13 (0.01)
 5565   8 ->  20 (0.91)  5 (0.01)  8 (0.01)  2 (0.01) 12 (0.00)
 7957   8 ->  13 (0.75)  2 (0.02)  1 (0.02)  4 (0.02) 10 (0.02)
 7725  18 ->  31 (0.81) 27 (0.02) 10 (0.01)  5 (0.01)  2 (0.01)
  939  12 ->  32 (0.89) 12 (0.01) 13 (0.01)  1 (0.01)  2 (0.01)
 3086   8 ->   5 (0.82) 12 (0.01)  1 (0.01) 13 (0.01)  2 (0.01)
10675  25 ->  21 (0.92) 25 (0.01) 33 (0.01)  5 (0.00) 38 (0.00)
 7867   4 ->  13 (0.88)  1 (0.01)  4 (0.01)  2 (0.01) 38 (0.01)
 8421   4 ->   7 (0.78)  1 (0.02)  2 (0.02) 38 (0.02) 10 (0.02)
  335   6 ->   5 (0.60)  1 (0.04)  4 (0.03)  2 (0.03) 13 (0.02)
11564  30 ->   3 (0.22) 23 (0.09) 25 (0.08) 31 (0.04) 29 (0.04)
10782  18 ->  22 (0.73) 13 (0.02)  2 (0.02)  1 (0.02)  4 (0.02)
 8702  26 ->   8 (0.40)  1 (0.05)  2 (0.05) 12 (0.05) 13 (0.04)
 6000   1 ->   4 (0.84) 38 (0.01) 12 (0.01)  2 (0.01) 13 (0.01)
11964   7 ->   8 (0.78)  2 (0.02)  1 (0.01) 13 (0.01) 38 (0.01)
 4575   8 ->  20 (0.77)  2 (0.01) 38 (0.01)  1 (0.01) 13 (0.01)
 7756   8 ->  20 (0.94)  8 (0.01)  5 (0.00)  1 (0.00)  2 (0.00)
 5977   7 ->   8 (0.80) 38 (0.01)  2 (0.01)  1 (0.01) 13 (0.01)
 4714  11 ->  18 (0.79) 13 (0.02)  1 (0.01)  2 (0.01) 10 (0.01)
 2494   2 ->   1 (0.84)  3 (0.01) 10 (0.01)  4 (0.01) 12 (0.01)
 9952  12 ->  32 (0.73) 15 (0.12) 34 (0.04) 25 (0.01) 12 (0.01)
 2228  26 ->   8 (0.55)  1 (0.03) 38 (0.03)  2 (0.03) 10 (0.03)
 9623   7 ->   5 (0.62) 13 (0.03)  2 (0.03) 38 (0.03) 10 (0.03)
  806  26 ->   8 (0.56) 17 (0.09) 12 (0.03)  1 (0.02) 10 (0.02)
 5521   6 ->  25 (0.48) 38 (0.05)  2 (0.04)  1 (0.04) 10 (0.03)
10564  39 ->  38 (0.20)  1 (0.09) 13 (0.07)  2 (0.07)  3 (0.07)
 4185  21 ->  12 (0.38) 25 (0.05) 10 (0.05)  2 (0.05) 38 (0.04)
 2109  30 ->  38 (0.17) 20 (0.15)  1 (0.12)  4 (0.06) 12 (0.06)
 6334  26 ->   8 (0.63)  2 (0.03)  1 (0.03) 10 (0.03) 12 (0.02)
12264  18 ->  27 (0.78) 26 (0.09) 25 (0.01) 13 (0.01) 11 (0.01)
 9254  18 ->  27 (0.92) 18 (0.01) 13 (0.01)  4 (0.00) 10 (0.00)
 3298  18 ->  31 (0.82)  2 (0.01) 13 (0.01)  1 (0.01) 12 (0.01)
 4443  21 ->  12 (0.34) 25 (0.07)  2 (0.05)  1 (0.04) 10 (0.04)
10931  12 ->  40 (0.93) 15 (0.01) 12 (0.01)  2 (0.00) 10 (0.00)
12226  26 ->   8 (0.63)  2 (0.03) 13 (0.03)  1 (0.03) 12 (0.02)
 6600   5 ->   1 (0.86) 13 (0.01)  2 (0.01)  4 (0.01) 12 (0.01)
12214  12 ->  32 (0.92) 12 (0.01)  1 (0.01) 10 (0.01) 15 (0.00)
11746  30 ->  24 (0.93)  2 (0.01) 30 (0.01) 10 (0.01) 11 (0.00)
11976  22 ->  20 (0.56) 25 (0.03) 13 (0.03) 12 (0.03)  1 (0.03)
 5967  19 ->  23 (0.77) 38 (0.02)  2 (0.01)  4 (0.01) 13 (0.01)
  492   4 ->   5 (0.91) 18 (0.01)  4 (0.01) 13 (0.01) 12 (0.01)
10586  31 ->   3 (0.75)  2 (0.02) 38 (0.02) 10 (0.02)  1 (0.02)
 3370  12 ->  32 (0.89)  1 (0.03) 38 (0.01)  2 (0.01) 12 (0.01)
 9095  24 ->  18 (0.60)  2 (0.04) 13 (0.04) 12 (0.03) 10 (0.02)
10324  12 ->  32 (0.90) 38 (0.02)  1 (0.01)  2 (0.01) 13 (0.01)
   33  23 ->  31 (0.95) 23 (0.01) 41 (0.00)  2 (0.00) 10 (0.00)
 4341  12 ->  32 (0.90) 17 (0.01)  1 (0.01)  2 (0.01) 10 (0.01)
10698  41 ->   9 (0.87) 13 (0.01) 38 (0.01)  4 (0.01)  1 (0.01)
11016   3 ->  13 (0.86)  2 (0.01) 18 (0.01) 10 (0.01)  4 (0.01)
 1210  18 ->  31 (0.86)  1 (0.01) 10 (0.01) 12 (0.01)  2 (0.01)
 3574  12 ->  15 (0.91)  3 (0.01) 25 (0.01)  1 (0.01) 34 (0.01)
 8918   6 ->  28 (0.53) 11 (0.10) 12 (0.05)  1 (0.03)  5 (0.03)
11070   6 ->  12 (0.15)  1 (0.10)  5 (0.08)  4 (0.07) 32 (0.06)
 8007  26 ->   8 (0.89)  1 (0.01)  2 (0.01) 12 (0.01) 13 (0.01)
 7098  25 ->  27 (0.91) 35 (0.01) 12 (0.01)  2 (0.01) 25 (0.00)
 8622  12 ->  32 (0.91) 25 (0.01)  2 (0.01) 12 (0.00)  1 (0.00)
 8549  30 ->  20 (0.92) 30 (0.00)  6 (0.00) 33 (0.00) 38 (0.00)
  114  30 ->  11 (0.71) 13 (0.02)  2 (0.02) 12 (0.02) 38 (0.02)
 4739   7 ->   1 (0.86)  2 (0.01) 10 (0.01)  5 (0.01) 12 (0.01)
 6356  26 ->   8 (0.70)  1 (0.02) 13 (0.02)  4 (0.02) 10 (0.02)
 3250  27 ->  25 (0.16) 28 (0.09) 11 (0.08)  1 (0.06)  2 (0.06)
11616  38 ->  39 (0.93) 12 (0.01) 38 (0.00)  5 (0.00)  9 (0.00)
11178  16 ->   9 (0.65)  2 (0.03)  1 (0.03) 10 (0.02)  4 (0.02)
  537   6 ->  28 (0.35) 30 (0.05)  5 (0.04)  9 (0.04) 11 (0.04)
 2379  12 ->  32 (0.90) 35 (0.01) 14 (0.01)  2 (0.00)  5 (0.00)
 1898  21 ->  12 (0.36) 11 (0.05)  3 (0.04) 13 (0.04)  2 (0.04)
 6960  17 ->  12 (0.86)  2 (0.01)  4 (0.01)  1 (0.01) 13 (0.01)
 6200  26 ->  25 (0.77)  2 (0.02) 13 (0.02)  1 (0.01)  4 (0.01)
 5529  12 ->   0 (0.93)  1 (0.01)  2 (0.00) 13 (0.00) 38 (0.00)
 8585  27 ->  28 (0.49)  2 (0.06)  1 (0.04) 12 (0.03) 13 (0.03)
 8051   3 ->   5 (0.90)  1 (0.01)  2 (0.01) 12 (0.01)  4 (0.00)
   87   7 ->   8 (0.89) 38 (0.01) 13 (0.01)  2 (0.01) 12 (0.01)
 3704  40 ->  35 (0.55) 12 (0.03)  2 (0.03) 10 (0.03)  1 (0.03)
12120  25 ->  11 (0.86)  9 (0.02)  4 (0.01) 10 (0.01)  2 (0.01)
  127  30 ->  24 (0.82) 26 (0.04) 13 (0.01) 12 (0.01)  2 (0.01)
 4219  13 ->  32 (0.91) 38 (0.01)  1 (0.01)  9 (0.01) 25 (0.01)
10173  26 ->   8 (0.80)  2 (0.02)  1 (0.02) 12 (0.01) 13 (0.01)
 4271   6 ->  32 (0.89) 30 (0.02) 33 (0.01) 28 (0.01)  2 (0.01)
 4889  19 ->  23 (0.73)  2 (0.02) 13 (0.02)  4 (0.02) 38 (0.02)
 2898  15 ->  32 (0.84)  5 (0.01) 38 (0.01)  3 (0.01)  2 (0.01)
12024  26 ->   8 (0.48)  2 (0.04)  1 (0.04) 12 (0.04) 38 (0.04)
 4634  26 ->   8 (0.71) 13 (0.02)  2 (0.02)  1 (0.02) 10 (0.02)
 5844  18 ->  27 (0.95)  4 (0.00) 13 (0.00) 10 (0.00)  5 (0.00)
 9484  10 ->  31 (0.95)  2 (0.00)  5 (0.00)  3 (0.00) 12 (0.00)
11643  18 ->  27 (0.92) 11 (0.01)  2 (0.00)  5 (0.00) 12 (0.00)
 3655   3 ->   5 (0.90) 38 (0.01) 11 (0.01)  1 (0.01) 13 (0.01)
 5637  21 ->  12 (0.33) 11 (0.07) 25 (0.06) 13 (0.05)  2 (0.05)
10367  27 ->  28 (0.34) 29 (0.13)  1 (0.09) 30 (0.05)  2 (0.04)
 8382  39 ->  33 (0.67)  2 (0.02)  1 (0.02)  4 (0.02) 10 (0.02)
 4160  11 ->  33 (0.87)  9 (0.05) 35 (0.01) 34 (0.01)  1 (0.01)
 6177   3 ->   5 (0.89) 38 (0.01)  1 (0.01)  2 (0.01) 11 (0.01)
 1402  30 ->  21 (0.93)  1 (0.01)  4 (0.00) 13 (0.00)  5 (0.00)
 6321  42 ->  41 (0.92) 13 (0.01)  5 (0.00)  9 (0.00)  2 (0.00)
 4314  26 ->   8 (0.81)  2 (0.01) 13 (0.01)  1 (0.01) 38 (0.01)
 7695  40 ->  37 (0.82) 15 (0.02) 35 (0.02) 14 (0.02) 18 (0.01)
  163  25 ->  27 (0.92) 12 (0.01)  2 (0.01)  5 (0.01) 38 (0.00)
 8187  26 ->   8 (0.81)  1 (0.01)  2 (0.01) 13 (0.01)  4 (0.01)
 2993  30 ->   8 (0.92) 38 (0.01)  2 (0.01)  1 (0.01) 12 (0.00)
  380  19 ->  23 (0.77)  2 (0.02)  1 (0.01) 13 (0.01) 10 (0.01)
 8587  26 ->   8 (0.70)  1 (0.02)  2 (0.02)  4 (0.02) 13 (0.02)
 4794  25 ->  27 (0.93)  2 (0.01) 10 (0.00) 12 (0.00)  1 (0.00)
 9222  21 ->  11 (0.52) 12 (0.11) 13 (0.04) 10 (0.03)  1 (0.03)
 6295  38 ->  34 (0.92)  9 (0.01) 13 (0.01) 18 (0.01) 14 (0.00)
 3744  12 ->  15 (0.94)  5 (0.01)  2 (0.00)  4 (0.00)  8 (0.00)
 1746  39 ->  38 (0.69)  1 (0.03)  5 (0.02)  2 (0.02) 13 (0.02)
10292  41 ->   9 (0.66)  2 (0.03) 13 (0.03)  1 (0.02) 12 (0.02)
 6418  36 ->  38 (0.75) 13 (0.02)  2 (0.02) 12 (0.02) 10 (0.01)
 4147  21 ->  12 (0.74) 10 (0.02)  1 (0.02) 13 (0.02) 38 (0.02)
 2980  41 ->   9 (0.68)  2 (0.03)  1 (0.02) 12 (0.02)  4 (0.02)
11142   5 ->  15 (0.95) 13 (0.01) 31 (0.00)  5 (0.00) 10 (0.00)
 1583  21 ->  12 (0.95) 18 (0.00) 31 (0.00) 13 (0.00)  1 (0.00)
 7142  26 ->   8 (0.71)  2 (0.02)  1 (0.02) 10 (0.02) 38 (0.02)
12396   3 ->  34 (0.91) 38 (0.01) 30 (0.01)  7 (0.01)  2 (0.00)
 3739  18 ->  27 (0.94) 13 (0.00)  1 (0.00) 10 (0.00)  2 (0.00)
 4585  27 ->  11 (0.55) 24 (0.04) 33 (0.04)  1 (0.03) 12 (0.03)
 4620  19 ->  31 (0.69)  3 (0.03) 13 (0.02)  1 (0.02) 10 (0.02)
10030  21 ->  12 (0.26) 11 (0.12) 13 (0.05)  2 (0.05)  1 (0.04)
12215  31 ->  20 (0.93) 33 (0.01) 12 (0.01) 13 (0.00) 10 (0.00)
11741  34 ->  38 (0.79) 13 (0.02) 12 (0.01)  1 (0.01)  5 (0.01)
12354  26 ->   8 (0.96) 42 (0.00) 22 (0.00) 14 (0.00) 16 (0.00)
  374  26 ->   8 (0.81)  1 (0.01)  2 (0.01) 13 (0.01)  4 (0.01)
 2112  18 ->  27 (0.90) 25 (0.01)  4 (0.01)  8 (0.01) 12 (0.01)
 8499   6 ->  32 (0.76) 38 (0.09)  5 (0.02)  2 (0.01)  1 (0.01)
 9270  21 ->  12 (0.75) 13 (0.02)  2 (0.02) 10 (0.02)  1 (0.02)
12008  22 ->  20 (0.79) 13 (0.01)  1 (0.01) 12 (0.01) 25 (0.01)
 3373  23 ->  21 (0.81) 28 (0.07) 24 (0.05)  8 (0.02) 13 (0.00)
 6739  25 ->  21 (0.93) 11 (0.01) 38 (0.00)  5 (0.00)  4 (0.00)
 1489  12 ->  15 (0.95)  4 (0.00) 25 (0.00) 10 (0.00) 33 (0.00)
 1643  19 ->  23 (0.59)  2 (0.03)  1 (0.03) 13 (0.03) 10 (0.03)
 4105  41 ->   9 (0.89)  2 (0.01)  5 (0.01) 13 (0.01) 38 (0.01)
 3562   5 ->   7 (0.95)  3 (0.00) 16 (0.00) 31 (0.00) 26 (0.00)
 2370  21 ->  12 (0.51) 11 (0.05) 13 (0.04) 10 (0.04)  2 (0.04)
 5507  25 ->  26 (0.96)  3 (0.00)  5 (0.00)  2 (0.00)  4 (0.00)
 5515  19 ->  23 (0.75)  2 (0.02)  1 (0.02) 13 (0.02) 10 (0.01)
  368  19 ->  23 (0.78)  2 (0.01)  1 (0.01) 13 (0.01)  4 (0.01)
 3564  27 ->  24 (0.39)  2 (0.05) 38 (0.04)  1 (0.04) 13 (0.04)
10205  26 ->   8 (0.81)  2 (0.01)  1 (0.01) 38 (0.01) 13 (0.01)
 2258  12 ->  15 (0.94)  8 (0.01) 31 (0.01) 18 (0.01) 26 (0.00)
 4252  18 ->  22 (0.95) 20 (0.00)  2 (0.00)  1 (0.00) 10 (0.00)
 8308  12 ->  15 (0.94) 10 (0.00) 17 (0.00) 13 (0.00) 25 (0.00)
 6920  24 ->  27 (0.44) 31 (0.17) 30 (0.03) 25 (0.03) 10 (0.02)
 1364   8 ->  20 (0.94)  1 (0.01)  4 (0.00) 10 (0.00) 13 (0.00)
 7997  18 ->  26 (0.96) 12 (0.00)  4 (0.00)  5 (0.00)  2 (0.00)
 9475  41 ->   9 (0.77)  2 (0.03) 13 (0.02) 12 (0.02) 10 (0.01)
 8422  27 ->  21 (0.69)  1 (0.02)  2 (0.02) 38 (0.02) 13 (0.02)
 3593  41 ->   9 (0.79) 13 (0.02)  4 (0.01)  2 (0.01)  1 (0.01)
 1763  41 ->   9 (0.90) 13 (0.01)  1 (0.01)  2 (0.01) 12 (0.01)
  924  19 ->  23 (0.77) 10 (0.02)  2 (0.02)  1 (0.01)  4 (0.01)
 6693  41 ->   9 (0.57)  2 (0.03) 13 (0.03) 38 (0.03)  1 (0.03)
 3567  30 ->  21 (0.92) 25 (0.01)  3 (0.01)  7 (0.00) 12 (0.00)
12518  18 ->  31 (0.96) 11 (0.01) 12 (0.00)  8 (0.00)  1 (0.00)
10204  21 ->  12 (0.62) 13 (0.03)  2 (0.03)  1 (0.03) 10 (0.03)
 6261  41 ->   9 (0.80)  2 (0.02)  1 (0.01) 13 (0.01) 10 (0.01)
11002  19 ->  23 (0.62)  2 (0.03) 13 (0.03)  4 (0.02) 38 (0.02)
12111  21 ->  12 (0.85) 10 (0.01)  2 (0.01)  1 (0.01) 13 (0.01)
 5693  27 ->  24 (0.49) 21 (0.20)  1 (0.03) 29 (0.02) 10 (0.02)
 1431  27 ->  24 (0.27) 11 (0.22)  2 (0.05)  1 (0.04) 13 (0.04)
 8398  19 ->  23 (0.77)  2 (0.02)  1 (0.01) 10 (0.01)  4 (0.01)
 6731  26 ->   8 (0.90) 38 (0.01)  5 (0.01) 11 (0.01)  1 (0.01)
 3109  39 ->  33 (0.82) 13 (0.01)  1 (0.01)  2 (0.01) 12 (0.01)
 6469   4 ->   7 (0.96) 10 (0.01) 25 (0.01) 26 (0.00)  6 (0.00)
 5667  27 ->  24 (0.32) 29 (0.22) 21 (0.15)  1 (0.03)  2 (0.02)
 4456  27 ->  24 (0.87) 30 (0.02) 29 (0.02)  1 (0.01) 11 (0.01)
 2388  31 ->  20 (0.94) 35 (0.01) 15 (0.00)  2 (0.00) 12 (0.00)
 6220  26 ->   8 (0.91)  2 (0.01) 11 (0.01)  1 (0.01) 10 (0.00)
 6862  39 ->  33 (0.94)  8 (0.01) 10 (0.00) 15 (0.00) 13 (0.00)
 9834  19 ->  23 (0.79)  1 (0.02) 13 (0.01)  2 (0.01)  4 (0.01)
 9609  36 ->  32 (0.85)  2 (0.01)  1 (0.01) 38 (0.01)  7 (0.01)
 6612  27 ->  24 (0.31)  2 (0.07) 13 (0.05)  1 (0.04) 38 (0.04)
 2810  19 ->  23 (0.94) 35 (0.01)  1 (0.01)  9 (0.00) 15 (0.00)
 9792  21 ->  12 (0.79)  1 (0.02) 10 (0.02)  2 (0.01) 13 (0.01)
 1910  22 ->  19 (0.89)  4 (0.02)  3 (0.01) 13 (0.01)  1 (0.01)
 2748  27 ->  21 (0.94)  8 (0.00) 13 (0.00)  4 (0.00)  2 (0.00)
 7597  31 ->  20 (0.95)  2 (0.00)  1 (0.00)  5 (0.00)  8 (0.00)
 7010  19 ->  23 (0.94)  3 (0.01) 13 (0.00)  2 (0.00) 28 (0.00)
 7147  31 ->  20 (0.86) 28 (0.08) 13 (0.00) 29 (0.00)  2 (0.00)
10328  18 ->  31 (0.97) 25 (0.00)  1 (0.00)  2 (0.00) 38 (0.00)
 5180  18 ->  31 (0.96)  7 (0.00)  9 (0.00)  1 (0.00)  2 (0.00)
 3523   6 ->  42 (0.89) 41 (0.01)  3 (0.01) 12 (0.00)  7 (0.00)
 6149  18 ->  31 (0.97)  1 (0.00)  2 (0.00) 11 (0.00)  5 (0.00)
  829  27 ->  24 (0.81) 29 (0.07) 18 (0.01) 30 (0.01) 11 (0.01)
 4643  19 ->  23 (0.93)  3 (0.01) 15 (0.00) 18 (0.00) 13 (0.00)
  176   3 ->   5 (0.95) 12 (0.01) 35 (0.00) 15 (0.00) 28 (0.00)
 5233  27 ->  21 (0.78) 24 (0.14)  8 (0.01) 33 (0.00) 32 (0.00)
 9835  21 ->  12 (0.77) 10 (0.02) 13 (0.02)  2 (0.02)  1 (0.02)
 1844   5 ->   1 (0.97)  3 (0.00) 39 (0.00) 26 (0.00)  2 (0.00)
11808  19 ->  23 (0.94) 34 (0.00) 15 (0.00) 22 (0.00) 14 (0.00)
 3727  26 ->   8 (0.96)  2 (0.00) 12 (0.00)  4 (0.00)  1 (0.00)
 5057  27 ->  21 (0.94)  2 (0.00)  4 (0.00) 13 (0.00)  3 (0.00)
 2057  36 ->  38 (0.89) 35 (0.01)  8 (0.01)  4 (0.01)  5 (0.01)
 2559  23 ->  21 (0.93)  2 (0.00) 13 (0.00) 38 (0.00)  1 (0.00)
 6068  36 ->  38 (0.90)  2 (0.01) 31 (0.01)  1 (0.01)  3 (0.01)
 8424  37 ->  35 (0.94)  1 (0.01)  2 (0.00) 14 (0.00) 10 (0.00)
 4088  28 ->  20 (0.93)  3 (0.01)  6 (0.00)  4 (0.00) 33 (0.00)
 8624  30 ->  10 (0.49) 12 (0.05) 38 (0.04)  1 (0.03) 13 (0.03)
11415  27 ->  29 (0.85)  1 (0.02)  2 (0.01) 13 (0.01) 10 (0.01)
 3714  19 ->  23 (0.94)  5 (0.01)  2 (0.00) 12 (0.00)  1 (0.00)
 3220  24 ->  18 (0.87) 38 (0.01)  5 (0.01) 10 (0.01) 12 (0.01)
 7081  30 ->  20 (0.82) 28 (0.05) 25 (0.03) 35 (0.02) 12 (0.01)
 6913  23 ->  21 (0.93)  4 (0.00) 30 (0.00)  2 (0.00)  1 (0.00)
 9509  27 ->  28 (0.88) 25 (0.01)  1 (0.01)  2 (0.01) 13 (0.01)
12434  16 ->  20 (0.94)  2 (0.01)  1 (0.00) 25 (0.00) 13 (0.00)
 5518  27 ->  24 (0.90) 30 (0.01)  3 (0.01)  2 (0.01) 12 (0.01)
 9434  23 ->  20 (0.93) 15 (0.01)  9 (0.00)  2 (0.00) 13 (0.00)
 9158  27 ->  28 (0.95)  2 (0.00)  1 (0.00)  4 (0.00)  3 (0.00)
 6509  27 ->  24 (0.90) 25 (0.02) 10 (0.01)  2 (0.01)  1 (0.01)
  310  19 ->  23 (0.94) 10 (0.01) 12 (0.00)  8 (0.00)  4 (0.00)
 5895  27 ->  21 (0.50) 24 (0.43) 42 (0.01) 30 (0.00) 12 (0.00)
 7553  27 ->  24 (0.91)  1 (0.01)  2 (0.01)  7 (0.01) 13 (0.00)
10398  21 ->  12 (0.92)  1 (0.01)  3 (0.01) 13 (0.00)  2 (0.00)
10878  27 ->  21 (0.69) 24 (0.24) 12 (0.01)  4 (0.01) 13 (0.00)
 7225  27 ->  24 (0.87) 21 (0.02) 30 (0.02)  1 (0.01) 12 (0.01)
 4562  27 ->  21 (0.94) 13 (0.00) 38 (0.00)  2 (0.00)  1 (0.00)
 7085  42 ->  41 (0.92)  2 (0.01) 12 (0.01) 17 (0.00) 13 (0.00)
 2847  27 ->  24 (0.92)  2 (0.01) 11 (0.01) 10 (0.01) 13 (0.00)
 7073  27 ->  21 (0.94)  2 (0.00)  1 (0.00) 13 (0.00) 12 (0.00)
 9880  39 ->  40 (0.77) 35 (0.08) 10 (0.01) 38 (0.01) 18 (0.01)
 2157  27 ->  24 (0.56) 21 (0.27) 29 (0.06) 23 (0.01)  1 (0.01)
 4437  20 ->  11 (0.94) 35 (0.01)  9 (0.01) 13 (0.00) 27 (0.00)
10845  39 ->  33 (0.92) 20 (0.02) 12 (0.01)  4 (0.00)  8 (0.00)
  496  30 ->  11 (0.97) 18 (0.00) 15 (0.00)  4 (0.00) 34 (0.00)
In [757]:
show_images(X_test, y_test, y_test_pred, cost_test_wrong, 49)
In [759]:
def show_wrong_samples(data, label, sign_names, cost, result, score):
    # visualize the first 8 entries of `cost` (sort order chosen by the caller)
    index = [pair[0] for pair in cost[:8]]
    images_ = data[index]
    labels_ = label[index]
    result_ = np.array(result)[index]
    score_ = score[index]
    show_images_with_prop(images_, labels_, sign_names, result_, score_)

# lowest-cost wrong predictions first
show_wrong_samples(X_test, y_test, sign_names, cost_test_wrong, result_test, score_test)

# then the highest-cost wrong predictions
cost_test_wrong.sort(key = lambda pair: pair[1], reverse = True)
show_wrong_samples(X_test, y_test, sign_names, cost_test_wrong, result_test, score_test)

Question 9

If necessary, provide documentation for how an interface was built for your model to load and classify newly-acquired images.

Answer: inference(preprocessed_images, labels)

Parameters:

  • preprocessed_images: numpy.ndarray with dimensions [num, height, width, channels]. The images should already have been run through preprocess(), which zero-means and normalizes them so the trained model can classify them.
  • labels: numpy.ndarray of integer labels.

Returns:

  • result: array of booleans indicating whether each prediction is correct.
  • score: array of (softmax) probabilities for all predictions, with dimensions [num, categories].
  • loss: the prediction loss.
  • accuracy: the prediction accuracy.
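As an illustration of how the returned score array can be consumed downstream (toy values, independent of the trained model), top-1 predictions and accuracy follow directly from argmax:

```python
import numpy as np

# Toy stand-in for the `score` array returned by inference():
# 3 images, 2 categories.
score = np.array([[0.1, 0.9],
                  [0.8, 0.2],
                  [0.3, 0.7]])
labels = np.array([1, 0, 0])

pred = np.argmax(score, axis=1)            # top-1 class per image -> [1, 0, 1]
accuracy = float(np.mean(pred == labels))  # 2 of 3 correct
```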

Note: Once you have completed all of the code implementations and successfully answered each question above, you may finalize your work by exporting the iPython Notebook as an HTML document. You can do this by using the menu above and navigating to File -> Download as -> HTML (.html). Include the finished document along with this notebook as your submission.

In [ ]: